Bridging AIC and BIC: A New Criterion for Autoregression
Abstract
To address order selection for an autoregressive model fitted to time series data, we propose a new information criterion. It combines the benefits of two well-known model selection techniques, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). When the data are generated from a finite-order autoregression, BIC is known to be consistent, and so is the new criterion. When the true order is infinite, or suitably high relative to the sample size, AIC is known to be efficient in the sense that its predictive performance is asymptotically equivalent to the best offered by the candidate models; in this case, the new criterion behaves similarly. Unlike the two classical criteria, the proposed criterion adaptively achieves either consistency or efficiency depending on the underlying true model. In practice, where the observed time series comes without prior information about the model specification, the proposed order selection criterion is therefore more flexible and reliable than the classical approaches. Numerical results demonstrate the adaptivity of the proposed technique when applied to various datasets.
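The abstract's proposed criterion is not specified in closed form here, but the two classical criteria it bridges are standard and can be sketched. The following is a minimal illustration, not the authors' method: it fits AR(p) models by least squares over a range of candidate orders and selects the order minimizing AIC (penalty 2 per parameter) and BIC (penalty log n per parameter). All function names are our own.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model; returns the residual variance."""
    n = len(x)
    if p == 0:
        resid = x - x.mean()
        return resid @ resid / n
    # Design matrix of lagged values: row t holds x[t-1], ..., x[t-p]
    X = np.column_stack([x[p - j - 1 : n - j - 1] for j in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return resid @ resid / len(y)

def select_order(x, p_max=10):
    """Return (AIC-selected order, BIC-selected order) among 0..p_max."""
    n = len(x)
    aic, bic = {}, {}
    for p in range(p_max + 1):
        s2 = fit_ar(x, p)
        ll = -0.5 * n * np.log(s2)           # Gaussian log-likelihood, up to constants
        aic[p] = -2 * ll + 2 * (p + 1)       # AIC: penalty 2 per parameter
        bic[p] = -2 * ll + np.log(n) * (p + 1)  # BIC: penalty log(n) per parameter
    return min(aic, key=aic.get), min(bic, key=bic.get)

# Simulate a finite-order truth, AR(2): x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + e_t.
# In this regime the abstract says BIC (and the new criterion) are consistent.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
print(select_order(x))
```

Under a finite-order truth, as here, the BIC choice concentrates on the true order as n grows, while AIC may overshoot; when the true order is effectively infinite, the AIC choice tracks the best predictive order instead. The paper's contribution is a single criterion behaving like the better of the two in each regime.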
Related work
Can the strengths of AIC and BIC be shared?
It is well known that AIC and BIC have different properties in model selection. BIC is consistent in the sense that if the true model is among the candidates, the probability of selecting the true model approaches 1. On the other hand, AIC is minimax-rate optimal for both parametric and nonparametric cases for estimating the regression function. There are several successful results on construct...
Bayesian information criterion for longitudinal and clustered data.
When a number of models are fit to the same data set, one method of choosing the 'best' model is to select the model for which Akaike's information criterion (AIC) is lowest. AIC applies when maximum likelihood is used to estimate the unknown parameters in the model. The value of -2 log likelihood for each model fit is penalized by adding twice the number of estimated parameters. The number of ...
New g%AIC, g%AICc, g%BIC, and Power Divergence Fit Statistics Expose Mating between Modern Humans, Neanderthals and other Archaics
The purpose of this article is to look at how information criteria, such as AIC and BIC, interact with the g%SD fit criterion derived in Waddell et al. (2007, 2010a). The g%SD criterion measures the fit of data to model based on a normalized weighted root mean square percentage deviation between the observed data and model estimates of the data, with g%SD = 0 being a perfectly fitting model. Ho...
A New Application of Hidden Markov Model in Exchange Rate Forecasting
This paper presents a new application of the Hidden Markov Model (HMM) as a forecasting tool for predicting the currency exchange rate between the US dollar and the euro. The results show that the differences between price gaps, consisting of the open, high, and low prices, can be selected to produce the best model parameters for the Hidden Markov Model. Three model parameters based on Akaike In...
Geometric BIC
The author introduced the “geometric AIC” and the “geometric MDL” as model selection criteria for geometric fitting problems. These correspond to Akaike’s “AIC” and Rissanen’s “MDL”, respectively, well known in the statistical estimation framework. Another well-known criterion is Schwarz’ “BIC”, but its counterpart for geometric fitting has been unknown. This paper introduces the corresponding ...